- Renyi entropy
- энтропия f Реньи
English-Russian Dictionary on Probability, Statistics, and Combinatorics. — Philadelphia and Moscow: Society for Industrial and Applied Mathematics and TVP Science Publishers, K. A. Borovkov, 1994.
Rényi entropy — In information theory, the Rényi entropy, a generalisation of Shannon entropy, is one of a family of functionals for quantifying the diversity, uncertainty or randomness of a system. It is named after Alfréd Rényi. The Rényi entropy of order α,… … Wikipedia
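The snippet is cut off before the formula itself. As a quick illustration of the standard textbook definition (not text quoted from this dictionary), a minimal Python sketch for a discrete distribution might look like the following; the function name renyi_entropy is my own choice:

    import math

    def renyi_entropy(p, alpha, base=2.0):
        """Rényi entropy of order alpha: H_alpha = (1/(1-alpha)) * log(sum_i p_i**alpha).
        Standard definition; the alpha -> 1 limit recovers Shannon entropy."""
        support = [pi for pi in p if pi > 0]          # zero-probability outcomes drop out
        if math.isclose(alpha, 1.0):                  # Shannon entropy as the limiting case
            return -sum(pi * math.log(pi, base) for pi in support)
        return math.log(sum(pi ** alpha for pi in support), base) / (1.0 - alpha)

    # alpha = 0 gives the Hartley (max) entropy, alpha = 2 the collision entropy.
    print(renyi_entropy([0.5, 0.25, 0.25], alpha=2))   # about 1.415 bits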
Entropy — This article is about entropy in thermodynamics. For entropy in information theory, see Entropy (information theory). For a comparison of entropy in information theory with entropy in thermodynamics, see Entropy in thermodynamics and information… … Wikipedia
Entropy (information theory) — In information theory, entropy is a measure of the uncertainty associated with a random variable. The term by itself in this context usually refers to the Shannon entropy, which quantifies, in the sense of an expected value, the information… … Wikipedia
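To make the "expected value" reading concrete: the Shannon entropy is H(X) = E[-log p(X)] = -sum_i p_i log p_i, a standard formula rather than text from this entry. A minimal sketch, with names of my own choosing:

    import math

    def shannon_entropy(p, base=2.0):
        """H(X) = E[-log p(X)] = -sum_i p_i * log(p_i); zero-probability terms contribute nothing."""
        return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

    print(shannon_entropy([0.5, 0.25, 0.25]))   # 1.5 bits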
Entropy (disambiguation) — Additional relevant articles may be found in the following categories: Thermodynamic entropy Entropy and information Quantum mechanical entropy Entropy, in thermodynamics, is a measure of the energy in a thermodynamic system not available to do… … Wikipedia
Entropy (general concept) — In many branches of science, entropy refers to a certain measure of the disorder of a system. Entropy is particularly notable as it has a broad, common definition that is shared across physics, mathematics and information science. Although the… … Wikipedia
Min-entropy — In probability theory or information theory, the min-entropy of a discrete random event x with possible states (or outcomes) 1, …, n and corresponding probabilities p_1, …, p_n is … The base of the logarithm is just a scaling constant; for a… … Wikipedia
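The snippet breaks off where the formula stood; the usual definition (a standard fact, not recovered from the truncated text) is H_min = -log max_i p_i, i.e. the alpha -> infinity member of the Rényi family. A one-function sketch:

    import math

    def min_entropy(p, base=2.0):
        """Min-entropy: -log of the probability of the most likely outcome."""
        return -math.log(max(p), base)

    print(min_entropy([0.5, 0.25, 0.25]))   # 1.0 bit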
Binary entropy function — In information theory, the binary entropy function, denoted H(p) or H_b(p), is defined as the entropy of a Bernoulli trial with probability of success p. Mathematically, the Bernoulli trial is modelled as a random variable X that… … Wikipedia
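For concreteness, the standard definition is H_b(p) = -p log2(p) - (1-p) log2(1-p), with H_b(0) = H_b(1) = 0 by convention; a short sketch:

    import math

    def binary_entropy(p):
        """H_b(p) in bits; by convention 0*log(0) = 0, so the endpoints return 0."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

    print(binary_entropy(0.5))   # 1.0, the maximum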
Tsallis entropy — In physics, the Tsallis entropy is a generalization of the standard Boltzmann–Gibbs entropy. It was an extension put forward by Constantino Tsallis in 1988. It is defined as S_q(p) = (1/(q−1)) (1 − ∫ p^q(x) dx), or in the discrete… … Wikipedia
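The discrete form alluded to at the cut-off is, in the standard presentation, S_q(p) = (1 − sum_i p_i^q)/(q − 1), and the q -> 1 limit recovers the Boltzmann–Gibbs/Shannon expression. A sketch of that discrete case, under those assumptions:

    import math

    def tsallis_entropy(p, q):
        """Discrete Tsallis entropy; at q = 1 it reduces to the Shannon form with natural logarithms."""
        if q == 1:
            return -sum(pi * math.log(pi) for pi in p if pi > 0)
        return (1.0 - sum(pi ** q for pi in p if pi > 0)) / (q - 1.0)

    print(tsallis_entropy([0.5, 0.25, 0.25], q=2))   # 0.625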
Alfréd Rényi — (20 March 1921 – 1 February 1970) was a Hungarian mathematician who made contributions in combinatorics and graph theory but mostly in probability theory. (David Kendall, Obituary: Alfred Renyi, Journal of…) … Wikipedia
Alfréd Rényi — Alfréd Rényi (20 March 1921 – 1 February 1970) was a Hungarian mathematician. His contributions are mostly in combinatorics, graph theory and probability theory. In 1950, he founded the Institut de… … Wikipédia en Français